Global Optimization of Neural Network Weights – A Simulation Study
Authors
Abstract
Corresponding author: Dr. B. Wade Brorsen, Oklahoma State University, Department of Agricultural Economics, 414 Ag Hall, 74078; [email protected], ph. 405-744-6155.
The training of neural networks is a difficult optimization problem because of the nonconvex objective function. Therefore, many global search algorithms have been used to train neural networks as an alternative to local search algorithms. Local search algorithms, however, use computational resources more efficiently, so numerous random restarts with a local algorithm may be competitive with a global algorithm at obtaining a low value of the objective function. This study examines, through Monte Carlo simulations, the efficiency of a local search algorithm relative to nine stochastic global algorithms: two simulated annealing algorithms, one simple random stochastic algorithm, one genetic algorithm, and five evolutionary strategy algorithms. The results show that, even ignoring the computational requirements of the global algorithms, there is little evidence to support using the global algorithms examined in this paper to train neural networks.
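The multistart strategy the abstract describes can be sketched as follows. This is a hypothetical illustration, not the paper's code: a tiny one-hidden-layer tanh network is fit to toy data by a crude finite-difference gradient-descent local optimizer, restarted from several random initializations, keeping the best objective value found. The network size, data, learning rate, and number of restarts are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a noisy nonlinear target (illustrative, not from the paper).
X = np.linspace(-2, 2, 40)
y = np.sin(X) + 0.1 * rng.standard_normal(X.size)

H = 3  # hidden tanh units; model: yhat = tanh(x*w + b) @ v + c

def sse(theta):
    # Sum-of-squared-errors objective over the flat parameter vector theta.
    w, b, v, c = theta[:H], theta[H:2*H], theta[2*H:3*H], theta[3*H]
    yhat = np.tanh(np.outer(X, w) + b) @ v + c
    return np.sum((y - yhat) ** 2)

def local_search(theta0, iters=300, lr=0.01, eps=1e-5):
    # Stand-in local optimizer: gradient descent with a
    # forward-difference gradient approximation.
    theta = theta0.copy()
    for _ in range(iters):
        f0 = sse(theta)
        g = np.empty_like(theta)
        for i in range(theta.size):
            t = theta.copy()
            t[i] += eps
            g[i] = (sse(t) - f0) / eps
        theta -= lr * g
    return theta

# Multistart: restart the local optimizer from random initial weights
# and keep the lowest objective value reached.
best = min(sse(local_search(rng.standard_normal(3 * H + 1)))
           for _ in range(5))
```

In practice one would replace the finite-difference loop with backpropagation or a quasi-Newton routine; the point is only that repeated cheap local searches can serve as a baseline against global methods such as simulated annealing or evolutionary strategies.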
Similar references
Identification of Wind Turbine using Fractional Order Dynamic Neural Network and Optimization Algorithm
In this paper, an efficient technique is presented to identify a 2,500 kW wind turbine operating in the Kahak wind farm, Qazvin province, Iran. This complicated system, driven by wind behavior, is identified using a proposed fractional order dynamic neural network (FODNN) optimized with evolutionary computation. In the proposed method, some parameters of the FODNN are unknown during the process of i...
Neural Network Sensitivity to Inputs and Weights and its Application to Functional Identification of Robotics Manipulators
Neural networks are applied to system identification problems using adaptive algorithms for either parameter or functional estimation of dynamic systems. In this paper, the sensitivity of neural networks to input values and connection weights is studied. The Reduction-Sigmoid-Amplification (RSA) neurons are introduced and four different models of neural network architecture are proposed and...
A Recurrent Network with Stochastic Weights
Stochastic neural networks for global optimization are usually built by introducing random fluctuations into the network. A natural method is to use stochastic weights rather than stochastic activation functions. We propose a new model in which each neuron has very simple functionality but all the weights are stochastic. It is shown that the stationary distribution of the network uniquely exists ...
Global Optimization Methods for Designing and Training Feedforward Artificial Neural Networks
This paper presents a new method that integrates tabu search, simulated annealing, genetic algorithms, and backpropagation in both a pruning and a constructive manner. The approach obtained promising results in the simultaneous optimization of an artificial neural network architecture and weights. With the proposed method, we investigate four cost functions for global optimization methods: the ave...